Fixing Artifacts

Concepts

Camera Zoom
  • "How much do we magnify the world on top of the base pixel density?"

  • Zoom is a camera parameter, not part of the meter↔pixel definition.

screen_pixels =
    world_meters
    × PIXELS_PER_METER
    × zoom
  • Zoom belongs to the projection matrix, not the view matrix.

  • Multiply by zoom only when applying camera magnification, i.e., inside the projection matrix.

Texel (UV in Texel Space)
vec2 texel = uv * tex_size;
  • uv is assumed to be normalized texture coordinates in [0,1].

  • Multiplying by texture size converts from normalized UV space to texel space.

  • Example: if texture is 256×256 and uv = (0.5, 0.5), then texel = (128, 128).

  • Integer part:

    • Which texel.

    • floor(texel) returns the integer part.

      • Ex:

        • (128.3, 64.9) -> (128, 64)

  • Fractional part:

    • where inside the texel

    • fract(texel) returns the fractional part (x - floor(x)).

      • Ex:

      • (10.2,   5.7) -> (0.2, 0.7).

      • (128.0, 64.0) -> (0.0, 0.0).

    • Naming:

      • vec2 texel_phase = fract(texel); .

        • In signal processing and wave math, the fractional position within a repeating unit is commonly called the phase.

        • "This is the phase inside the repeating texel grid.".

      • vec2 texel_frac = fract(texel); .

      • vec2 texel_offset = fract(texel); .

Debugging

Fractional screen_pixels_per_texel
  • The only truly stable configuration is an integer screen_pixels_per_texel; any fractional scale → shimmer when the camera moves.

pixels_per_texel_x = viewport_width / (camera_world_width * texture_width_in_texels)
  • If this is not an integer, you will get shimmer.

Sub-texel Phase
vec2 tex_size = textureSize(texs[idx], 0);
vec2 texel = uv * tex_size;
vec2 texel_phase = fract(texel);
frag_color = vec4(texel_phase, 0.0, 1.0);
  • Good Results:

    • If everything were perfectly stable, you would see:

      • solid colors inside sprites

      • clean step changes at texel boundaries

  • Bad Results:

    • Seeing a “crazy pattern” means:

      • UVs are landing at fractional texel positions

      • and they vary across pixels in a non-uniform way

      • This proves your screen pixels are not mapping 1:1 to texels.

  • Results :

    • (2026-02-15)

      • Normal camera zoom:

        • There's a bunch of big squares that move in the direction opposite to the camera movement.

      • Floored camera zoom:

        • There's no longer big squares, but now very tiny ones, like pixels.

        • The pattern color changes when moving, but seems more stable than the previous version, except that I often get lines across the whole screen when moving, like everything is collapsing.

      • Set camera zoom to 1.0:

        • Huge squares that keep changing color when moving. Sometimes I get lines inside those big squares.

      • Floored camera zoom + floored camera pos + floored sprite pos:

        • With the screen on 1280x720:

          • Everything appears in monochromatic tones of green that change slightly in brightness when moving.

        • With the screen maximized to QuadHD screen (with the window bar on top, and task bar in the bottom):

          • The screen is filled with big horizontal rectangles that flicker when moving.

Fractional Scaling / UV Interpolation
vec2 tex_size = textureSize(texs[idx], 0);
vec2 texel = uv * tex_size;
frag_color = vec4(floor(texel) / 32.0, 0.0, 1.0);
  • Each pixel inside the same texel gets the same value → creates blocky regions.

  • Clean square blocks → UVs are well-behaved

  • Skewed/parallelogram blocks → UV interpolation or projection issues

  • Wobbling/shimmering blocks → precision or derivative problems

  • Results :

    • (2026-02-16)

      • Normal camera zoom / floored camera zoom / camera zoom set to 1.0:

        • Stable pattern: every sprite shows a UV-gradient look, with no change when moving the camera.

Rate of UV change per pixel
vec2 tex_size = textureSize(texs[idx], 0);
vec2 texel = uv * tex_size;
vec2 fw = fwidth(texel);
frag_color = vec4(vec3(length(fw)), 1.0);
  • Good Results:

    fw ≈ (1, 1) per axis
    length(fw) = sqrt(1² + 1²) = √2 ≈ 1.414
    
    • Meaning (per axis):

      • >1 → minification

      • <1 → magnification

      • non-uniform values → fractional scaling

    • When you display 1.414 in an unclamped color channel:

      • values > 1 are clamped to 1 in the framebuffer

      • but if your actual fw is, say, (0.7, 0.7):

        • length ≈ 0.99 → gray

      • So the color alone is not diagnostic.

  • Results :

    • (2026-02-16)

      • Stable gray color.

      • The gray tone depends on the camera zoom: with zoom 1.0 the screen is white, and with zoom 5.0 it is a darkish gray.

Projection Misalignment
frag_color = vec4(fract(gl_FragCoord.xy), 0, 1);
  • Good Results:

    • Perfectly stable pattern

    • No movement when camera moves

  • Bad Results:

    • If it shifts → projection misalignment.

vec2 f = abs(fract(gl_FragCoord.xy) - 0.5);
float dist = max(f.x, f.y);
frag_color = vec4(vec3(dist * 4.0), 1.0);
  • Interpretation

    • dark = pixel center aligned (good)

    • bright = between pixels (bad)

    • If brightness changes when the camera moves → confirmed source.

  • Results :

    • (2026-02-16) With or without camera zoom rounding:

      • Stable green/yellow pattern.

      • No bright changes.

Orthographic Projection Half-pixel Bias
  • Pattern for testing:

    // Frag
    vec2 tex_size = textureSize(texs[idx], 0); 
    vec2 texel_uv = uv * tex_size;
    vec2 phase = fract(texel_uv); 
    frag_color = vec4(phase, 0.0, 1.0);
    
  • Setup 1:

    // CPU
    ppm_x := PIXELS_PER_METER * camera._effective_zoom.x 
    ppm_y := PIXELS_PER_METER * camera._effective_zoom.y 
    camera.proj_matrix = proj_matrix_orthographic_vulkan_classical_z_finite( 
        left = (-f32(camera.origin.x) + 0.5) / ppm_x, 
        right = ( f32(camera.extent.w - camera.origin.x) + 0.5) / ppm_x, 
        top = (-f32(camera.origin.y) + 0.5) / ppm_y, 
        bottom = ( f32(camera.extent.h - camera.origin.y) + 0.5) / ppm_y, 
        near = 0.0, far = 1.0
    ) 
    
  • Setup 2:

    vec2 pixel_size = 2.0 / globals.framebuffer_size;
    gl_Position.xy += pixel_size * 0.5;
    
  • Result :

    • (2026-02-16)

      • Camera zoom floored + sprite pos

      • The pattern with both setups is really chaotic, with a bunch of vertical and horizontal lines.

      • In Vulkan, unlike old D3D9-era pipelines, you usually do NOT need a half-pixel bias if your orthographic matrix is mathematically correct.

Orthographic projection is not enforcing integer pixel mapping
  • Even with:

    • floored camera

    • floored zoom

    • nearest filtering

  • You will still shimmer if your projection allows fractional screen coverage.

HiDPI / OS compositor scaling
  • They must match exactly:

    • swapchain extent

    • framebuffer size

    • viewport size

    • window size

    • orthographic projection dimensions

  • Results :

    • With the screen on 1280x720:

      window.extent:           Extent{w = 1280, h = 720} 
      window.swapchain.extent: Extent2D{width = 1280, height = 720} 
      glfw.GetWindowSize:      (1280 720) 
      glfw.GetWindowFrameSize: (8, 31, 8, 8) 
      glfw.GetFramebufferSize: (1280 720) 
      monitor_size:            Extent{w = 2560, h = 1440}
      
    • With the screen maximized to QuadHD screen (with the window bar on top, and task bar in the bottom):

      window.extent:           Extent{w = 2560, h = 1377} 
      window.swapchain.extent: Extent2D{width = 2560, height = 1377} 
      glfw.GetWindowSize:      (2560 1377) 
      glfw.GetWindowFrameSize: (8, 31, 8, 8) 
      glfw.GetFramebufferSize: (2560 1377) 
      monitor_size:            Extent{w = 2560, h = 1440}
      
Vertex Positions after transform
  • Even if you snap sprites in world space, after projection they may land between pixels.

// Vertex
vec4 clip = proj * view * model * vec4(pos, 1.0);
vec2 ndc = clip.xy / clip.w;
debug_screen = (ndc * 0.5 + 0.5) * framebufferSize;
gl_Position = clip;

// Frag
frag_color = vec4(fract(debug_screen), 0.0, 1.0);
  • Good Results:

    • Nearly constant color per triangle.

  • Bad Results:

    • Noisy or shifting pattern → subpixel positioning confirmed.

    • If not near integer pixel centers → that’s the shimmer source.

  • Results :

    • (2026-02-16)

      • With or without round camera zoom:

        • Many stable squares with gradient colors.

        • Seeing a grid of stable squares means:

          • derivatives are constant per pixel quad (expected)

          • but not equal to 1

          • and vary spatially

        • This almost always indicates:

          • Screen pixels do not map 1:1 to texels

        • More specifically:

          • each screen pixel footprint spans a fractional number of texels

          • so nearest filtering flips between neighbors as the camera moves

Fractional Viewport
viewport.x = 0
viewport.y = 0
viewport.width = framebuffer_width
viewport.height = framebuffer_height
  • Fractional viewport → shimmer.

Sampler
  • Must be:

minFilter = NEAREST
magFilter = NEAREST
mipmapMode = NEAREST
  • CLAMP_TO_EDGE

  • no mipmaps

  • no anisotropy

Solutions proposal

  • General recommendations :

    • Use texelFetch whenever possible.

      • texelFetch(sampler2D, ivec2(coord), 0) is the most predictable way to read a texel without sampler filtering or dependency on derivatives. Combine with integer coords: ivec2(floor(uv * tex_size)).

      • This avoids fwidth/derivative pitfalls and is cheap and deterministic.

      • texelFetch  is typically cheaper and avoids derivative hardware work. fwidth and smoothstep per-fragment add cost.

    • Rotations / scaling:

      • Pixel-perfect nearest sampling + rotation is inherently problematic. If you need rotated pixel-art, either pre-rotate frames (artwork side) or accept interpolation/AA. The derivative-based AA solutions reduce shimmering but change the crisp look.

  • GPU: Use .CLAMP_TO_EDGE .

    • General recommendation to avoid texture bleeding, or add padding to atlases.

  • GPU: Anisotropy OFF :

    • (2026-02-15) I'm not sure if this helps or not, but it was said to leave it off, so I did.

  • GPU: Filter NEAREST .

  • GPU: No mipmaps .

  • CPU: Snap the projection matrix, via snap the camera zoom :

    camera._effective_zoom.x = math.round(camera._effective_zoom.x)
    camera._effective_zoom.y = math.round(camera._effective_zoom.y)
    
    • (2026-02-15) This helps.

  • CPU: Snap the camera :

    • Use camera_pos = round(camera_pos * render_upscale) / render_upscale to snap the camera.target coordinate.

    • camera_pos = floor(camera_pos * pixels_per_unit) / pixels_per_unit; .

    • I used to use this in RayLib. It helped a bit.

    • (2026-02-15)

      • I have to snap the camera as well, otherwise the main character will flicker in the direction of the movement. This is expected, as I'm just snapping the matrix of the sprite when drawing, not its actual .pos , so the camera uses the unsnapped position for _game.camera.pos = dyn_body_trans.pos .

      • Floored sprites, floored camera, floored zoom:

        • Sub-texel phase pattern:

          • The pattern was improved.

  • CPU: Snap geometry :

    mat := mat
    mat[0, 2] = (math.floor(mat[0, 2] * PIXELS_PER_METER * 5.0) / 5.0) / PIXELS_PER_METER
    mat[1, 2] = (math.floor(mat[1, 2] * PIXELS_PER_METER * 5.0) / 5.0) / PIXELS_PER_METER
    
    • (2026-02-15)

      • I have to snap the camera as well, otherwise the main character will flicker in the direction of the movement. This is expected, as I'm just snapping the matrix of the sprite when drawing, not its actual .pos , so the camera uses the unsnapped position for _game.camera.pos = dyn_body_trans.pos .

      • Using both "Snap geometry" and "snap the camera" creates some gaps between sprites, which is not good. Either way, the shimmering doesn't seem to improve.

      • Floored sprites, floored camera, floored zoom:

        • Sub-texel phase pattern:

          • The pattern was improved.

  • Fragment Shader: uv_aa_linear :

    • uv_aa_linear

      • Source: https://www.shadertoy.com/view/ltBfRD

      vec2 tex_size = textureSize(texs[idx], 0);
      vec2 texel = uv * tex_size; 
      vec2 texel_floor = floor(texel + 0.5);
      float width = 1.0;  // Options: 1 pixel filter, 1.5 pixel filter, 2.0 pixel filter
      uv = (texel_floor + clamp((texel - texel_floor) / fwidth(texel) / width, -0.5, 0.5)) / tex_size;
      albedo = texture(texs[idx], uv);
      
      • (2026-02-15)

        • With nearest filtering:

          • Floored sprites, floored camera, floored zoom:

            • No weird edges around the sprites.

            • Sub-texel phase pattern:

              • The pattern was improved. The green flickers less when moving.

        • With linear filtering:

          • Floored sprites, floored camera, floored zoom:

            • Gives a weird edge around sprites.

            • Sub-texel phase pattern:

              • The pattern was improved. The green flickers less when moving.

    • uv_aa_smoothstep :

      • Source: https://www.shadertoy.com/view/ltBfRD

      vec2 tex_size = textureSize(texs[idx], 0);
      vec2 texel = uv * tex_size; 
      vec2 texel_floor = floor(texel + 0.5);
      vec2 texel_fract = fract(texel + 0.5);
      float width = 1.0;  // Options: 1 pixel filter, 1.5 pixel filter, 2.0 pixel filter
      vec2 texel_aa = fwidth(texel) * width * 0.5;
      texel_fract = smoothstep(
          vec2(0.5) - texel_aa,
          vec2(0.5) + texel_aa,
          texel_fract
      );
      uv = (texel_floor + texel_fract - 0.5) / tex_size;
      albedo = texture(texs[idx], uv);
      
      • (2026-02-15)

        • With nearest filtering:

          • Floored sprites, floored camera, floored zoom:

            • Gives a weird edge around sprites.

            • Sub-texel phase pattern:

              • The pattern was improved. The green flickers less when moving.

        • With linear filtering:

          • Floored sprites, floored camera, floored zoom:

            • Gives a weird edge around sprites.

            • Sub-texel phase pattern:

              • The pattern was improved. The green flickers less when moving.

    • What they do

      • Use fwidth  to create an anti-aliased transition zone across texel boundaries; one uses a clamped linear blend, the other uses smoothstep.

    • Pros

      • Reduced shimmering and softer transitions when the texture mapping footprint covers multiple texels.

      • Smoothstep gives visually nicer blending with fewer harsh transitions.

    • Cons

      • More expensive per-fragment.

      • Requires careful tuning of width  and correct handling of fwidth == 0 .

      • Blending breaks "pure" nearest look; you get soft edges that may not fit some pixel-art aesthetics.

    • When to use

      • If you permit a small amount of filtering/anti-aliasing to reduce shimmering for rotated/scaled sprites where snapping is not viable.

  • Vertex Shader: Inset UVs :

    • Purpose :

      • Prevents sampling outside the sprite region in the atlas.

    • It fixes :

      • atlas edge bleed

      • neighbor tile leakage

      • precision at UV borders

    • It does not fix :

      • shimmering during camera motion

      • subpixel raster instability

      • pixel-perfect alignment issues

    • Use it if :

      • sprites come from a tight atlas

      • you use linear filtering (especially)

        • With nearest filtering, this is usually secondary but still good hygiene.

      • you see seams even when camera is static

      • tiles touch in the atlas

    • Vertex Shader:

      // Original UV
      frag_tex_coord = (vertex_tex_coord * pc.uv_scale) + pc.uv_pos;
      
      // Adding Inset to UV
      uint idx = nonuniformEXT(pc.tex_idx);
      if (idx != NO_TEXTURE) {
          vec2 tex_size = textureSize(texs[idx], 0);
          // Half-texel in UV space
          vec2 half_texel = 0.5 / tex_size;
          // Apply symmetric inset
          frag_tex_coord += mix(half_texel, -half_texel, vertex_tex_coord);
      }
      
      • (2026-02-15)

        • For texture bleeding:

          • This fixed the issue, without other solutions required.

        • With nearest filtering:

          • Floored sprites, floored camera, floored zoom:

            • Sub-texel phase pattern:

              • The pattern showed a lot of big squares with different colors.

              • The pattern was therefore worsened.

        • With linear filtering:

          • Floored sprites, floored camera, floored zoom:

            • Everything looks blurred.

  • Vertex Shader: Position snapping in clip space :

    • Purpose :

      • Ensures rasterization happens on exact pixel boundaries.

    • It fixes :

      • seams that appear only while moving

      • nearest-filter flicker

      • pixel-perfect instability

      • subpixel camera artifacts

    • It does not fix :

      • atlas bleed from bad UVs

      • mip bleeding

      • repeat wrap leakage

    • Snaps vertex positions in clip/NDC → pixel space to pixel centers.

    • Pros

      • Keeps the whole sprite aligned to the pixel grid (prevents sub-pixel motion of the quad as a whole).

      • Very cheap and deterministic.

    • Cons / caveats

      • If you snap per-vertex (without treating the quad as a single unit) you can distort the quad (vertices snap differently) — do snapping per-instance / per-sprite origin so the whole quad shifts the same amount.

      • Rotations and non-uniform scaling: snapping in screen-space will quantize the transformed geometry and can produce visually wrong shapes for rotated/scaled sprites.

      • Must keep UV mapping consistent with the snapped transform (otherwise texture appears offset by fractional pixels).

    • When to use

      • Static, axis-aligned sprites or translations where you can snap the sprite origin once (best if computed per-instance on CPU or in vertex shader once per sprite).

    • Practical change

      • Snap the sprite origin (or instance translation) rather than each quad vertex. That keeps the quad intact.

    • "Since you are doing 2D with nearest, you should snap positions, not UVs."

      • In the vertex shader after projection (or before), snap to pixel grid.

      vec4 vertex_pos_cs = globals.proj_matrix * vertex_pos_vs;
      vertex_pos_cs.xy = floor(vertex_pos_cs.xy * viewport_size) / viewport_size;
          // Snapping positions.
          // convert to pixel space, snap, convert back
      gl_Position = vertex_pos_cs;
      
      • The safer pixel snap (post-projection) is:

        • Without the proper transform, snapping can be slightly off.

        • Version 1:

          vec4 vertex_pos_cs = globals.proj_matrix * vertex_pos_vs;
          
          // convert NDC → pixel space
          vec2 pixel_pos = (vertex_pos_cs.xy * 0.5 + 0.5) * viewport_size;
          
          // snap
          pixel_pos = floor(pixel_pos) + 0.5;
          
          // back to NDC
          vertex_pos_cs.xy = ((pixel_pos / viewport_size) * 2.0 - 1.0);
          
          gl_Position = vertex_pos_cs;
          
          • (2026-02-15) Gives a crazy wobble in the whole screen.

            • Problem: vertex_pos_cs  is different for each vertex.

            • That deforms the quad slightly every frame as rounding flips.

            • You should snap once per sprite origin; do not snap per vertex.

        • Version 2:

          • For when the world coordinates are in pixels.

          // Origin
          vec3 origin_pos3_ws = pc.model_matrix * vec3(0.0, 0.0, 1.0);
          vec4 origin_pos_ws  = vec4(origin_pos3_ws.x, origin_pos3_ws.y, 0.0, origin_pos3_ws.z);
          vec4 origin_pos_vs  = globals.view_matrix * origin_pos_ws;
          vec4 origin_pos_cs  = globals.proj_matrix * origin_pos_vs;
          vec2 origin_pos_ps  = (origin_pos_cs.xy * 0.5 + 0.5) * globals.viewport_size;
              // Pixel Space
          
          // snap ONCE per sprite
          origin_pos_ps = floor(origin_pos_ps) + 0.5;
          
          vec2 vertex_pos_ps = (vertex_pos_cs.xy * 0.5 + 0.5) * globals.viewport_size;
              // Pixel Space
          
          // offset in pixels from origin
          vec2 offset_ps = vertex_pos_ps - ((origin_pos_cs.xy * 0.5 + 0.5) * globals.viewport_size);
          // rebuild final pixel position
          vec2 final_ps = origin_pos_ps + offset_ps;
          // back to NDC
          vec2 final_ndc = (final_ps / globals.viewport_size) * 2.0 - 1.0;
          
          gl_Position = vec4(final_ndc, vertex_pos_cs.z, vertex_pos_cs.w);
          frag_pos_ws = vec2(vertex_pos_ws);
          
          • (2026-02-15) Wobbles differently, but still wobbles.

        • Version 3:

          • For when the world coordinates are in pixels.

          vec2 vertex_pos_ps = vertex_pos_vs.xy * PIXELS_PER_METER;
          vertex_pos_ps = floor(vertex_pos_ps + 0.5); // round to nearest pixel
          vertex_pos_vs.xy = vertex_pos_ps / PIXELS_PER_METER;
          
          vec4 vertex_pos_cs  = globals.proj_matrix * vertex_pos_vs;
          gl_Position = vertex_pos_cs;
          
        • Version 4:

          • Snap in clip space.

          vertex_pos_cs.xy = floor(vertex_pos_cs.xy * globals.viewport_size) / globals.viewport_size;
          
          • (2026-02-15) Wobbles.

        • Version 5:

          // convert to NDC
          vec2 ndc = vertex_pos_cs.xy / vertex_pos_cs.w;
          
          // snap in pixel space
          vec2 pixel = (ndc * 0.5 + 0.5) * globals.viewport_size;
          pixel = floor(pixel);
          ndc = (pixel / globals.viewport_size) * 2.0 - 1.0;
          
          // back to clip
          vertex_pos_cs.xy = ndc * vertex_pos_cs.w;
          
          • (2026-02-15) Wobbles.

        • Version 6:

          vec2 fb = globals.framebuffer_size; // (width, height)
          // clip -> NDC
          vec2 ndc = vertex_pos_cs.xy / vertex_pos_cs.w;
          // NDC -> pixel coords
          vec2 pixel = (ndc * 0.5 + 0.5) * fb;
          // snap
          pixel = floor(pixel + 0.5);
          // pixel -> NDC
          ndc = (pixel / fb - 0.5) * 2.0;
          // write back to clip space
          vertex_pos_cs.xy = ndc * vertex_pos_cs.w;
          
          • (2026-02-15) Wobbles.

      • Many engines do this (wrong):

        • snap each vertex independently

        • This causes:

          • quad distortion

          • UV drift

          • shimmering

        • Make sure snapping happens once per sprite, not per vertex.

  • Fragment Shader: AA Point Sampling, simple fract/min :

    • Source: https://www.shadertoy.com/view/MlB3D3

    uint idx = nonuniformEXT(pc.tex_idx);
    vec2 tex_size = textureSize(texs[idx], 0);
    vec2 texel = uv * tex_size; 
    vec2 pix = floor(texel) + min(fract(texel) / fwidth(texel), vec2(1.0)) - vec2(0.5);
    uv =  pix / tex_size;
    albedo = texture(texs[idx], uv);
    
    • What it does

      • pix = floor(tex_pos) + min(fract(tex_pos) / fwidth(tex_pos), 1.0) - 0.5 — when the footprint is about one texel per pixel this reduces to tex_pos - 0.5, so it's not a true nearest calculation.

      • It’s simpler but not adaptive.

    • Pros

      • Simple.

    • Cons

      • The hard min clamp leaves no tunable transition width, so it won't prevent flicker when the subpixel position crosses texel boundaries.

      • It’s effectively sampling at tex_pos - 0.5  which can be surprising at the edges and for atlases.

    • When to use

      • Not recommended for clean pixel-art nearest sampling.

    • (2026-02-15)

      • With nearest filtering:

        • Floored sprites, floored camera, floored zoom:

          • Things look skinnier and weird.

          • Sub-texel phase pattern:

            • No improvement to the pattern.

      • With linear filtering:

        • Floored sprites, floored camera, floored zoom:

          • Things look blurry, like the usual when using linear filtering.

          • Sub-texel phase pattern:

            • No improvement to the pattern.

  • Fragment Shader: Samples at texel centers — canonical nearest sampling implemented in shader :

    • Pros

      • Simple, explicit nearest sampling; deterministic.

      • Matches what texelFetch would return if you convert coords to integer indices.

    • Cons

      • When UVs are interpolated across a triangle, different fragments can floor to different texels as the sprite moves sub-pixel, producing the expected discrete jumps (this is the shimmering you see unless geometry is pixel-aligned).

      • Still uses texture() (filtered sampler), so hardware filtering/mip settings must be consistent (use nearest sampler or disabled mipmaps).

    • When to use

      • Good if you must use texture() for arrayed bindless samplers. Prefer texelFetch if available.

    • Source: https://www.shadertoy.com/view/ltBfRD

    uint idx = nonuniformEXT(pc.tex_idx);
    vec2 tex_size = textureSize(texs[idx], 0);
    uv = uv * tex_size;
    albedo = texture(texs[idx], (floor(uv) + 0.5) / tex_size);
    
    • (2026-02-15) Didn't help at all.

      • After snapping the projection matrix, the result is the same.

    • Samples at texel centers with texelFetch  — canonical nearest sampling implemented in shader :

      • It's a better alternative.

      uint idx = nonuniformEXT(pc.tex_idx);
      vec2 tex_size = textureSize(texs[idx], 0);
      ivec2 coord = ivec2(floor(uv * tex_size));
      albedo = texelFetch(texs[idx], coord, 0);
      
      • texelFetch  samples exact texel with integer coords and avoids interpolation/derivative issues.

      • (2026-02-15)

        • With nearest filtering:

          • Floored sprites, floored camera, floored zoom:

            • Things look skinnier and weird.

            • Sub-texel phase pattern:

              • No improvement to the pattern.

        • With linear filtering:

          • Floored sprites, floored camera, floored zoom:

            • Things look blurry, like the usual when using linear filtering.

            • Sub-texel phase pattern:

              • No improvement to the pattern.

  • Fragment Shader: fwidth  in texel space to blend a texel :

    • Source: ChatGPT

    uint idx = nonuniformEXT(pc.tex_idx);
    vec2 tex_size = textureSize(texs[idx], 0);
    vec2 tex_pos = uv * tex_size; 
    vec2 fw = max(fwidth(tex_pos), vec2(1e-6));
    vec2 pix = floor(tex_pos) + min(fract(tex_pos) / fw, vec2(1.0)) - vec2(0.5);
    albedo = texture(texs[idx], pix / tex_size);
    
    • What it does

      • Uses the fragment footprint ( fwidth ) in texel units to choose/blend a texel and avoid divide-by-zero.

      • Compute derivatives in texel space and avoid divide-by-zero

      • Choose the nearest texel but allow a small smoothing based on derivatives

    • Pros

      • Adapts to the footprint size; can reduce popping during scale changes and handle partial coverage better than a hard floor.

      • Safer when derivatives are non-zero and indicate minification/magnification.

    • Cons / gotchas

      • fwidth can be zero for some primitives (flat varyings) — you handled that with max(eps), but that’s an approximation.

      • If transform is anisotropic (very different x/y scales), fwidth can be misleading unless handled per-axis.

      • More expensive than integer fetch; tuning needed (eps, clamp ranges).

      • If textures come from different samplers or textures change across the primitive, derivatives become unreliable.

    • When to use

      • When you need smooth transitions across varying scales and cannot rely on snapping + texelFetch .

      • Good for dynamic scale/rotation where you want some anti-aliasing.

    • (2026-02-15)

      • With nearest filtering:

        • Floored sprites, floored camera, floored zoom:

          • Sub-texel phase pattern:

            • The pattern gets worse, with a lot of lines in the screen.

      • With linear filtering:

        • Floored sprites, floored camera, floored zoom:

          • Sub-texel phase pattern:

            • The pattern gets worse, with a lot of lines in the screen.

  • CPU: Share geometry edges :

    • When building tile meshes, reuse the same vertex for adjacent tiles so FP rounding can’t create tiny gaps

  • CPU: Extend the Source/Compress the Dest :

    • To compensate for that, you sometimes need to extend the geometry out by a small bit to kind of force the rounding to happen differently. In the cases I have seen with this issue, extending the tile by half to a quarter of a unit helps the GPU round it in a better way.

    • I used values of: 1px , 0.5px , 0.25px , 0.125px , 0.02px  of compression/expansion on all sides.

    • I tested:

      • Compressing the source.

        • It made the visuals look "bubbly" and very weird at low resolutions.

          • This happens even with values of 0.02 .

        • At high resolutions the effect also existed, although smaller.

        • The overall impression is that the pixel art effect was lost and it looked very bad.

      • Expanding the source.

        • Made no sense.

      • Compressing the destination.

        • The effect looks very similar to "compressing the source", but worse:

          • Sometimes one sprite was drawn in front of another.

          • Sometimes the sprites left the "world grid" and also gave the impression of breaking the pixel art.

      • Expanding the destination.

        • Made no sense.

    • The result was not good.

Texture Bleeding

Problem demo
  • (2026-02-15)

    • In Vulkan, using tilesets.


  • (2025-06-26)

    • In RayLib, using tilesets.

Explanation
  • Old answer from the RayLib discord: "The core issue is that RayLib uses OpenGL, so everything you are drawing is 3d geometry. That geometry is transformed by 3d matrices as part of the camera and view transforms. Due to the nature of floating points there can be small rounding issues during that transform, so you end up with a 1 pixel gap between things. It is also compounded by the fact that GPUs and GPU drivers handle all that math and sometimes work slightly differently. If this was all done with raw pixels, it would be much simpler, but sadly GPUs don't work that way anymore."

Pixel Art Shimmering

  • Future ideas :

    • If texture bleeding happens:

      • Even with sprite and camera snapping there were gaps. This is odd...

        • Could this be caused by the quad size when it's created?

          • Check the dimensions.

          • Is it maybe simply due to random float imprecisions?

      • Consider rendering in pixels and not meters, while having the positions, etc, in meters.

  • (2026-02-16) Applied solutions :

    • CPU: Snap the projection matrix, via snap the camera zoom

    • Fragment Shader: uv_aa_linear

    • Current visual :

      • Ok.

  • (2026-02-15) Applied solutions :

    • CPU: Snap the projection matrix, via snap the camera zoom

    • CPU: Snap the camera

    • CPU: Snap geometry

    • Fragment Shader: uv_aa_linear

    • Current visual :

      • There's a "vibration" when moving a character, probably due to how the camera is being snapped into place.

        • Smoothing the camera movement probably brings back the shimmering.

      • Some sprites now have gaps, due to sprite snapping.

Solutions proposed
  • Shimmering Artifact in Pixel Art .

    • The idea behind the filtering is simple: apply nearest  filter to pixels that are fully inside a texel, and use bilinear  with custom weights at the edges.

    • That is the core idea.

    • The tricky part is creating custom weights for the bilinear filter to work, and that is why there are so many different filters.

    • The rest of the article is: We will now see the different filters included in the testbed. Nothing that I wrote, so I will include links to the original material and you can see the authors' work.

    • See the links at the end of the article to view the analyzed shaders.

      • https://www.shadertoy.com/view/MllBWf

      • https://www.shadertoy.com/view/ltfXWS

      • https://www.shadertoy.com/view/4dlXzB

      • https://www.shadertoy.com/view/ltBGWc / https://www.shadertoy.com/view/MlB3D3

      • https://www.shadertoy.com/view/ltBfRD

      • https://www.shadertoy.com/view/ltcGDX

        • scale/rotate a pixel-art-piece while preserving the chunky pixels (zoomed in view to highlight the effect of the filtering)

    • The article references Handmade Hero#Chat 18 - Shimmering artifact in Pixel Art and nSight Shader Analysis  and comments on the shader made.

  • Handmade Hero#Chat 18 - Shimmering artifact in Pixel Art and nSight Shader Analysis .

  • Removing the shimmering artifact from Pixel Art in 3D .

    • Well explained.

    • Unlike the videos above, this considers that Pixel Art can be rotated in 3D.

    • He uses mipmap and premultiplied alpha.

    • Final shader:


Pixel Art Camera Smoothness